Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory
Authors
Abstract
Similar Articles
Recurrent Neural Networks: Associative Memory and Optimization
Due to feedback connections, recurrent neural networks (RNNs) are dynamic models. RNNs can provide a more compact structure for approximating dynamic systems than feedforward neural networks (FNNs). For some RNN models, such as the Hopfield model and the Boltzmann machine, the fixed-point property of the dynamics can be used for optimization and associative memory. The Hopfield model...
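As a concrete illustration of the fixed-point property mentioned above, the sketch below implements a minimal Hopfield-style memory in NumPy, with Hebbian storage and synchronous sign updates. It is a generic textbook construction, not the spatially arranged sparse architecture of the paper, and all names are illustrative.

```python
import numpy as np

# Minimal Hopfield-style associative memory: Hebbian storage and
# fixed-point recall by repeated sign updates (illustrative sketch only).

def store(patterns):
    """Build a Hebbian weight matrix from +/-1 patterns (rows)."""
    patterns = np.asarray(patterns, dtype=float)
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, probe, steps=20):
    """Iterate the network from a noisy probe until it reaches a fixed point."""
    s = np.sign(np.asarray(probe, dtype=float))
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0       # break ties deterministically
        if np.array_equal(s_new, s):  # fixed point reached
            break
        s = s_new
    return s

# Usage: store two patterns, then recover one from a corrupted cue.
p1 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
p2 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
W = store([p1, p2])
noisy = p1.copy(); noisy[0] *= -1     # flip one bit
print(recall(W, noisy))               # settles back to p1
```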
High Order Neural Networks for Efficient Associative Memory Design
We propose learning rules for recurrent neural networks with high-order interactions between some or all neurons. The designed networks exhibit the desired associative memory function: perfect storage and retrieval of pieces of information and/or sequences of information of any complexity.
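The abstract does not spell out the learning rules; one common way to realize high-order interactions is to extend the Hebbian rule to a third-order tensor, as in the hedged sketch below (illustrative only, and likely simpler than the proposed rules).

```python
import numpy as np

# Illustrative second-order (three-way) generalization of Hebbian storage;
# the paper's actual high-order learning rules may differ.

def store_order2(patterns):
    """Third-order Hebbian tensor T[i,j,k] = sum_mu x_i x_j x_k (normalized)."""
    P = np.asarray(patterns, dtype=float)
    return np.einsum('mi,mj,mk->ijk', P, P, P) / P.shape[1] ** 2

def recall_order2(T, probe, steps=10):
    """Update s_i = sign(sum_jk T[i,j,k] s_j s_k) until the state stops changing."""
    s = np.sign(np.asarray(probe, dtype=float))
    for _ in range(steps):
        s_new = np.sign(np.einsum('ijk,j,k->i', T, s, s))
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```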
A Massively Parallel Associative Memory Based on Sparse Neural Networks
Associative memories store content in such a way that the content can be later retrieved by presenting the memory with a small portion of the content, rather than presenting the memory with an address as in more traditional memories. Associative memories are used as building blocks for algorithms within database engines, anomaly detection systems, compression algorithms, and face recognition sy...
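The contrast with address-based lookup can be made concrete with a toy sketch: retrieval returns the stored item that best matches a partial cue rather than the item at a given key. The code below is a hypothetical illustration and does not reflect the paper's sparse-network construction.

```python
import numpy as np

# Toy contrast between address-based and content-addressable lookup.
# An associative memory returns the stored item best matching a partial cue.

stored = {
    0: np.array([1, 1, 0, 1, 0, 0, 1, 0]),   # address-based access would need key 0
    1: np.array([0, 0, 1, 1, 1, 0, 0, 1]),
}

def associative_lookup(cue, mask):
    """Return the stored pattern agreeing most with the known (masked) bits."""
    score = lambda p: np.sum((p == cue) & mask)
    return max(stored.values(), key=score)

cue  = np.array([1, 1, 0, 0, 0, 0, 0, 0])     # only the first 3 bits are known
mask = np.array([1, 1, 1, 0, 0, 0, 0, 0], dtype=bool)
print(associative_lookup(cue, mask))          # recovers the first stored pattern
```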
Associative memory by recurrent neural networks with delay elements
The synapses of real neural systems seem to have delays; it is therefore worthwhile to analyze associative memory models with delayed synapses. Accordingly, a sequential associative memory model with delayed synapses is discussed, where a discrete synchronous updating rule and a correlation learning rule are employed. Its dynamic properties are analyzed by statistical neurodynamics. In this paper...
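A minimal sketch of the correlation learning rule and synchronous updating mentioned above is given below; for brevity it uses a single-step transition matrix, whereas the paper distributes the rule over delayed synapses, so treat it as an assumption-laden simplification.

```python
import numpy as np

# Sketch of sequence retrieval with a correlation (cross-pattern Hebbian)
# rule and discrete synchronous updates; synaptic delays are omitted for brevity.

def store_sequence(patterns):
    """W maps pattern mu to pattern mu+1: W = sum_mu x[mu+1] x[mu]^T / N."""
    X = np.asarray(patterns, dtype=float)
    N = X.shape[1]
    return sum(np.outer(X[m + 1], X[m]) for m in range(len(X) - 1)) / N

def step(W, s):
    """Synchronous update: every neuron switches at once."""
    s_new = np.sign(W @ s)
    s_new[s_new == 0] = 1.0
    return s_new

# Usage: starting from the first pattern, the network walks through the sequence.
seq = [np.array([1, -1, 1, -1]), np.array([1, 1, -1, -1]), np.array([-1, 1, -1, 1])]
W = store_sequence(seq)
s = seq[0]
for _ in range(2):
    s = step(W, s)
    print(s)      # prints the second, then the third stored pattern
```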
Spatially-sparse convolutional neural networks
Convolutional neural networks (CNNs) perform well on problems such as handwriting recognition and image classification. However, the performance of the networks is often limited by budget and time constraints, particularly when trying to train deep networks. Motivated by the problem of online handwriting recognition, we developed a CNN for processing spatially-sparse inputs; a character drawn w...
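To make the notion of spatially-sparse input concrete, the sketch below evaluates a small convolution only at output positions whose window covers an active site (e.g. a pixel the pen touched); it is an illustrative toy, not the paper's implementation.

```python
import numpy as np

# Sketch of the spatially-sparse idea: keep only active sites and compute a
# (valid) convolution only where its receptive field overlaps them.

def sparse_conv2d(active, kernel, grid_shape):
    """active: {(row, col): value}; returns {(row, col): output} for touched outputs."""
    kh, kw = kernel.shape
    out = {}
    for (r, c), v in active.items():
        # Each active input contributes to every output whose window covers it.
        for dr in range(kh):
            for dc in range(kw):
                orow, ocol = r - dr, c - dc
                if 0 <= orow <= grid_shape[0] - kh and 0 <= ocol <= grid_shape[1] - kw:
                    out[(orow, ocol)] = out.get((orow, ocol), 0.0) + v * kernel[dr, dc]
    return out

# Usage: three "pen" pixels on a 28x28 grid, 3x3 averaging kernel.
active = {(5, 5): 1.0, (5, 6): 1.0, (6, 6): 1.0}
print(sparse_conv2d(active, np.full((3, 3), 1.0 / 9.0), (28, 28)))
```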
Journal
Journal title: IEEE Transactions on Neural Networks and Learning Systems
Year: 2020
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/tnnls.2019.2899344